Inference for covariate adjusted regression via varying coefficient models
We consider covariate adjusted regression (CAR), a regression method for
situations where predictors and response are observed after being distorted by
a multiplicative factor. The distorting factors are unknown functions of an
observable covariate, where one specific distorting function is associated with
each predictor or response. The dependence of both response and predictors on
the same confounding covariate may alter the underlying regression relation
between undistorted but unobserved predictors and response. We consider a class
of highly flexible adjustment methods for parameter estimation in the
underlying regression model, which is the model of interest. Asymptotic
normality of the estimates is obtained by establishing a connection to varying
coefficient models. These distribution results combined with proposed
consistent estimates of the asymptotic variance are used for the construction
of asymptotic confidence intervals for the regression coefficients. The
proposed approach is illustrated with data on serum creatinine, and finite
sample properties of the proposed procedures are investigated through a
simulation study.
Comment: Published at http://dx.doi.org/10.1214/009053606000000083 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
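The adjustment idea described in the abstract (fit a varying-coefficient regression in the observable covariate, then average the fitted coefficient functions to recover the undistorted coefficients) can be illustrated with a minimal simulation sketch. This is not the paper's estimator: the distortion functions, true coefficients, sample size, and bin count below are hypothetical choices, constrained only by the usual identifiability condition that the distorting functions have mean one, and simple binning stands in for the smoothing methods analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, M = 5000, 20  # sample size, number of bins for the covariate (assumed)

# True underlying regression: Y = g0 + g1 * X + e  (unobserved)
g0, g1 = 1.0, 2.0
U = rng.uniform(0, 1, n)               # observable distorting covariate
X = rng.normal(2.0, 1.0, n)            # latent predictor, independent of U
Y = g0 + g1 * X + rng.normal(0, 0.5, n)

# Multiplicative distortions with mean 1 (identifiability condition)
psi = U + 0.5                          # E[psi(U)] = 1, distorts the response
phi = (3.0 - 2.0 * U) / 2.0            # E[phi(U)] = 1, distorts the predictor
Y_obs, X_obs = psi * Y, phi * X        # what we actually observe

# Bin U and fit a linear regression within each bin; the fitted
# coefficients estimate the varying-coefficient functions
#   beta0(u) = g0 * psi(u),  beta1(u) = g1 * psi(u) / phi(u).
edges = np.quantile(U, np.linspace(0, 1, M + 1))
b0_hat, b1_hat = np.zeros(M), np.zeros(M)
w, xbar_bin = np.zeros(M), np.zeros(M)
for m in range(M):
    idx = (U >= edges[m]) & ((U < edges[m + 1]) | (m == M - 1))
    A = np.column_stack([np.ones(idx.sum()), X_obs[idx]])
    b0_hat[m], b1_hat[m] = np.linalg.lstsq(A, Y_obs[idx], rcond=None)[0]
    w[m], xbar_bin[m] = idx.sum() / n, X_obs[idx].mean()

# Averaging undoes the distortion:
#   E[beta0(U)] = g0,  E[beta1(U) * X_obs] / E[X_obs] = g1.
g0_hat = np.sum(w * b0_hat)
g1_hat = np.sum(w * b1_hat * xbar_bin) / X_obs.mean()
print(g0_hat, g1_hat)  # both close to the true values (1.0, 2.0)
```

A naive regression of the observed response on the observed predictor is generally biased here, because both are confounded through U; conditioning on U via the binwise fits and then averaging removes that confounding.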
Principal Component Analysis for Functional Data on Riemannian Manifolds and Spheres
Functional data analysis on nonlinear manifolds has drawn recent interest.
Sphere-valued functional data, which are encountered for example as movement
trajectories on the surface of the earth, are an important special case. We
consider an intrinsic principal component analysis for smooth Riemannian
manifold-valued functional data and study its asymptotic properties. Riemannian
functional principal component analysis (RFPCA) is carried out by first mapping
the manifold-valued data through Riemannian logarithm maps to tangent spaces
around the time-varying Fr\'echet mean function, and then performing a
classical multivariate functional principal component analysis on the linear
tangent spaces. Representations of the Riemannian manifold-valued functions and
the eigenfunctions on the original manifold are then obtained with exponential
maps. The tangent-space approximation through functional principal component
analysis is shown to be well-behaved in terms of controlling the residual
variation if the Riemannian manifold has nonnegative curvature. Specifically,
we derive a central limit theorem for the mean function, as well as root-$n$
uniform convergence rates for other model components, including the covariance
function, eigenfunctions, and functional principal component scores. Our
applications include a novel framework for the analysis of longitudinal
compositional data, achieved by mapping longitudinal compositional data to
trajectories on the sphere, illustrated with longitudinal fruit fly behavior
patterns. RFPCA is shown to be superior in terms of trajectory recovery in
comparison to an unrestricted functional principal component analysis in
applications and simulations, and is also found to produce principal component
scores that are better predictors for classification than traditional
functional principal component scores.
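The RFPCA pipeline described above (log maps to tangent spaces around the time-varying Fréchet mean, classical multivariate FPCA there, exponential maps back to the manifold) can be sketched for the unit sphere. The simulated trajectories, grid sizes, and the simple fixed-point iteration for the Fréchet mean are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere_log(p, q):
    """Riemannian log map on the unit sphere: tangent vector at p toward q."""
    c = np.clip(q @ p, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-10:
        return np.zeros_like(p)
    return theta * (q - c * p) / np.linalg.norm(q - c * p)

def sphere_exp(p, v):
    """Riemannian exponential map on the unit sphere."""
    nv = np.linalg.norm(v)
    if nv < 1e-10:
        return p.copy()
    return np.cos(nv) * p + np.sin(nv) * v / nv

def frechet_mean(points, n_iter=20):
    """Intrinsic (Frechet) mean on the sphere by fixed-point iteration."""
    mu = points.mean(axis=0)
    mu /= np.linalg.norm(mu)
    for _ in range(n_iter):
        v = np.mean([sphere_log(mu, q) for q in points], axis=0)
        mu = sphere_exp(mu, v)
    return mu

# Simulate n sphere-valued trajectories on a common time grid, as random
# one-component perturbations around a smooth base curve on the sphere.
n, T = 40, 25
t = np.linspace(0, 1, T)
base = np.column_stack([np.cos(t), np.sin(t), 0.6 + 0 * t])
base /= np.linalg.norm(base, axis=1, keepdims=True)
scores_true = rng.normal(0, 0.2, n)
curves = np.empty((n, T, 3))
for i in range(n):
    for j in range(T):
        v = scores_true[i] * np.array([0.0, 0.0, 1.0])
        v = v - (v @ base[j]) * base[j]        # project into tangent plane
        curves[i, j] = sphere_exp(base[j], v)

# Step 1: time-varying Frechet mean function.
mu = np.array([frechet_mean(curves[:, j]) for j in range(T)])

# Step 2: map each trajectory into the tangent spaces along mu via log maps.
V = np.array([[sphere_log(mu[j], curves[i, j]) for j in range(T)]
              for i in range(n)])              # shape (n, T, 3)

# Step 3: classical multivariate FPCA on the flattened tangent vectors.
Vc = V.reshape(n, T * 3)
Vc = Vc - Vc.mean(axis=0)
_, s, W = np.linalg.svd(Vc, full_matrices=False)
phi1 = W[0].reshape(T, 3)                      # first eigenfunction
xi1 = Vc @ W[0]                                # first-component scores

# Step 4: map one-component reconstructions back with exponential maps.
recon = np.array([[sphere_exp(mu[j], xi1[i] * phi1[j]) for j in range(T)]
                  for i in range(n)])
err = np.mean(np.linalg.norm(recon - curves, axis=2))
print(err)  # near zero: the data were generated from a single component
```

Because the simulated data have a single mode of variation, one tangent-space component recovers the trajectories almost exactly; with real data the truncation level trades off residual variation, which the tangent-space approximation controls when curvature is nonnegative.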
Varying-coefficient functional linear regression
Functional linear regression analysis aims to model regression relations
which include a functional predictor. The analog of the regression parameter
vector or matrix in conventional multivariate or multiple-response linear
regression models is a regression parameter function in one or two arguments.
If, in addition, one has scalar predictors, as is often the case in
applications to longitudinal studies, the question arises how to incorporate
these into a functional regression model. We study a varying-coefficient
approach where the scalar covariates are modeled as additional arguments of the
regression parameter function. This extension of the functional linear
regression model is analogous to the extension of conventional linear
regression models to varying-coefficient models and shares its advantages, such
as increased flexibility; however, the details of this extension are more
challenging in the functional case. Our methodology combines smoothing methods
with regularization by truncation at a finite number of functional principal
components. A practical version is developed and is shown to perform better
than functional linear regression for longitudinal data. We investigate the
asymptotic properties of varying-coefficient functional linear regression and
establish consistency properties.
Comment: Published at http://dx.doi.org/10.3150/09-BEJ231 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
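The varying-coefficient functional linear model can be sketched by combining FPCA truncation with a binned regression in the scalar covariate: the regression parameter function beta(t, z) gains the scalar covariate z as an extra argument. The crude quantile binning below stands in for the smoothing-plus-truncation methodology of the paper, and all simulated quantities (basis functions, coefficient functions b1 and b2, sample sizes) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, K = 3000, 50, 2
t = np.linspace(0, 1, T)
dt = t[1] - t[0]

# Orthonormal basis functions on [0, 1], evaluated on the grid
phi = np.vstack([np.sqrt(2) * np.sin(np.pi * t),
                 np.sqrt(2) * np.sin(2 * np.pi * t)])

# Functional predictor X_i(t) = sum_k xi_ik * phi_k(t)
xi = rng.normal(0, 1, (n, K)) * np.array([2.0, 1.0])
X = xi @ phi

# The scalar covariate z enters as an extra argument of the regression
# parameter function: beta(t, z) = b1(z) phi_1(t) + b2(z) phi_2(t), so
#   Y_i = int beta(t, z_i) X_i(t) dt + e_i = sum_k b_k(z_i) xi_ik + e_i.
z = rng.uniform(0, 1, n)
b1 = lambda u: 1.0 + u                 # hypothetical coefficient functions
b2 = lambda u: np.sin(2 * np.pi * u)
Y = b1(z) * xi[:, 0] + b2(z) * xi[:, 1] + rng.normal(0, 0.2, n)

# FPCA of X: SVD of the centered data matrix; rescale by the grid spacing
# so eigenfunctions and scores are on the functional (integral) scale.
Xc = X - X.mean(axis=0)
_, _, W = np.linalg.svd(Xc, full_matrices=False)
phi_hat = W[:K] / np.sqrt(dt)          # estimated eigenfunctions
xi_hat = Xc @ W[:K].T * np.sqrt(dt)    # estimated FPC scores

# Eigenfunctions are identified only up to sign; align with the truth
# (known here because the data are simulated) before comparing.
for k in range(K):
    if phi_hat[k] @ phi[k] < 0:
        phi_hat[k], xi_hat[:, k] = -phi_hat[k], -xi_hat[:, k]

# Truncate at K components and estimate the varying coefficients by
# regressing Y on the scores within quantile bins of z (piecewise-constant
# stand-in for smoothing over z).
M = 15
edges = np.quantile(z, np.linspace(0, 1, M + 1))
z_mid, b_hat = np.zeros(M), np.zeros((M, K))
for m in range(M):
    idx = (z >= edges[m]) & ((z < edges[m + 1]) | (m == M - 1))
    A = np.column_stack([np.ones(idx.sum()), xi_hat[idx]])
    b_hat[m] = np.linalg.lstsq(A, Y[idx], rcond=None)[0][1:]
    z_mid[m] = z[idx].mean()

err = max(np.max(np.abs(b_hat[:, 0] - b1(z_mid))),
          np.max(np.abs(b_hat[:, 1] - b2(z_mid))))
print(err)  # binned estimates track b1, b2 at the bin centers
```

An ordinary functional linear regression corresponds to forcing b1 and b2 to be constant in z; on data like these it would average the coefficient functions over z and miss the modulation that the varying-coefficient fit recovers.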